
Cato Institute 2018 Surveillance Conference, Part 2 | C-SPAN | January 8, 2019, 1:39pm-3:01pm EST

1:39 pm
court, and public policy events in washington, d.c. and around the country. c-span is brought to you by your cable or satellite provider. >> president trump addresses the nation live from the oval office tonight about what his administration says is a humanitarian and national security crisis on the southern border. a disagreement between president trump and congressional democrats about funding a border wall has resulted in the partial government shutdown. tonight's address will be live at 9:00 eastern on c-span followed by a democratic response from house speaker nancy pelosi and senate minority leader chuck schumer. that will be followed by your phone calls. >> the cato institute held a surveillance conference last month in washington that covered a number of topics, including the growing network of security cameras in the u.s., surveillance of students and data collection by law enforcement agencies. this is about 90 minutes.
1:40 pm
>> welcome back. thank you so much. as i mentioned in my introductory remarks, there are so many fascinating issues surrounding surveillance, intelligence, new technologies that if we were to cover them all with the passion of the sort you just saw, this conference would last approximately three weeks, and because even i have limits to my capacity to focus for that long, we have for the last couple of years been inviting scholars and activists to present shorter talks but focus very tightly on a single subject and present work or analysis they have been doing in a way that allows us to get the sense of the range of hard questions we face as citizens
1:41 pm
and policy makers. our morning block of talks covers issues from facial recognition to social media surveillance to the global war on encryption on various fronts. i will very quickly introduce the speakers. if you want fuller biographies, look to the conference website on cato.org. you will find in addition to the agenda, links on the speakers' names for more extensive biographies. we will begin with analysis of recently passed legislation in australia that seeks to mandate law enforcement access to encrypted software, encrypted messaging tools. it's a first of its kind but could be a model for emulation elsewhere. i want to invite from new america, sharon bradford franklin.
1:42 pm
>> thank you. i'm with new america's open technology institute. if you had told me a year ago that i would be here today talking to you about australia, i would have actually thought you were joking, but i'm really glad to have the opportunity to speak with you today about the law just passed earlier this month in australia, and how this could actually allow the united states to look down under for an encryption back door. got to get the clicker working. here we go. so for those of you who may not actually be already familiar with the long-standing encryption debate, this is a battle that pits security against security. for years, the u.s. justice department and the federal
1:43 pm
bureau of investigation have been arguing that they are, quote, going dark, due to the increasing use of encryption. they have complained that they can no longer access many electronic communications, even when they have a valid court order. many tech companies now have encryption by default in their products and services, and they simply do not have access to their users' encrypted communications. the justice department and fbi want to require that tech companies guarantee that the government has exceptional access -- or what they have now started calling so-called responsible encryption -- so that they will always be able to access even encrypted messages. otherwise, they say, they are hampered in their ability to keep americans safe from terrorists and other criminals. but security researchers, tech companies and privacy advocates have pointed out that this would amount to an encryption back door that could be exploited by others. there is no way to guarantee
1:44 pm
that only the u.s. government would be able to use any such mechanism. rather, this amounts to deliberately building vulnerabilities into products and services; undermining device security in this way would harm everyone's privacy and cybersecurity, and it would create new risks that we would all become victims of criminal activity. in addition, as we explored in a half-day forum last month, encryption protects economic security and the personal safety and freedom of journalists and individuals in vulnerable communities, including victims of domestic violence. this debate, which has been going on for years in the united states, has now gone global, with a quick flareup down under in australia. this past august, the australian government released what they called an exposure draft of its telecommunications and other legislation amendment (assistance and access) bill 2018.
1:45 pm
unlike the u.s. congress, which takes months and months or more likely years before it passes anything, the australian parliament managed to wrap up its consideration of this bill in a mere four months. following a public comment period on the exposure draft, a slightly modified version of the bill was introduced in parliament and referred to the parliamentary joint committee on intelligence and security, or pjcis, which opened a new public comment period. my organization, the open technology institute, organized an international coalition of civil society organizations, tech companies and trade associations, and we filed three rounds of public comments on the bill, outlining our concerns, which i will describe in just a moment. the committee held a series of hearings, and then, just at the beginning of last week, the pjcis issued a report recommending passage of the bill with certain amendments incorporated. early in the morning just last
1:46 pm
thursday, december 6th, the parliament released an updated version of the bill including 173 amendments that no one had ever seen before, but by the end of the day, the australian parliament had passed the bill into law. so what does the australian law actually do? as one australian commentator put it, quote, the combined stupidity and cowardice of the coalition and labor now means that any i.t. product, hardware or software, made in australia, will be automatically too risky to use for anyone concerned about cybersecurity. so we are focusing here on schedule one of the australian law, the part designed to undermine the safeguards of encryption. there are also, folks should be aware, other sections of the law that create additional privacy threats and increase the powers of government hacking, but we are focusing on schedule one, which relates to encryption. now, the law includes what appears to be an encouraging
1:47 pm
statement that purports to prohibit the government from demanding the creation of encryption back doors, and i have it up here on the slide: section 317zg says the government may not request or require communications providers, quote, to implement or build a systemic weakness or systemic vulnerability, and also that the government must not prevent a communications provider from rectifying a systemic weakness or systemic vulnerability. however, the law grants unprecedented new authorities to the australian government that undermine this promise. specifically, the law creates three new and powerful tools for the australian government: technical assistance requests, or tars; technical assistance notices, or tans; and technical capability notices, or tcns. the requests are supposed to be voluntary whereas the notices are mandatory, and the difference between the tans and tcns
1:48 pm
depends on which government official is authorized to issue the notice. all of these authorities authorize the australian government to request or demand any quote, listed act or thing. that's a long list in the bill and it includes things like removing one or more forms of electronic protection that are or were applied by or on behalf of the provider and it also includes modifying or facilitating the modification of any of the characteristics of a service provided by the designated communications provider. in short, these are powers to demand that tech companies weaken the security features of their products. for example, the australian government can now make the same request to apple that the fbi made in the 2015 san bernardino shooter case, that they build a new operating system to circumvent iphone security features.
1:49 pm
as apple explained in that case, building the requested software tool would have made that technique widely available, thereby threatening the cybersecurity of other users. as we know, in the lawsuit here in the u.s., the united states government argued that under the somewhat obscure all writs act, which dates back to 1789, it was permitted to make this demand of apple, but apple, supported by other tech companies and privacy advocates, argued that this demand was unconstitutional. the justice department ultimately withdrew its demand before the court could resolve the legal question, because the fbi was able to pay an outside vendor to hack into the phone. but in australia, they now have a specific authority to make these kinds of demands. another worrisome scenario is that australia may seek to use its tcn authority in the same way the united kingdom is
1:50 pm
looking to use its powers. just last month, levy and robinson of the uk's gchq, essentially their nsa, put out a proposal under which tech companies would be asked or required to add gchq as a silent participant in end-to-end encrypted chats, and the tech company would suppress the notification to the user. they argue that, quote, you don't even have to touch encryption to add gchq as a ghost user inside the encrypted chat.
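to make the mechanics of that proposal concrete, here is a minimal, purely illustrative sketch in python -- the toy_encrypt function, the Provider class and the key names are hypothetical stand-ins, not any real messaging service's code -- showing how a group chat that encrypts each message to every participant key handed out by the provider's server can be silently opened up by adding one extra, hidden recipient:

```python
# illustrative sketch only: toy_encrypt stands in for real per-recipient
# public-key encryption; Provider, send_group_message and the key names
# are hypothetical, not any actual messaging service's api.
from dataclasses import dataclass, field

def toy_encrypt(recipient_key: str, plaintext: str) -> str:
    # stand-in for encrypting the message (or message key) to one recipient
    return f"enc[{recipient_key}]({plaintext})"

@dataclass
class Provider:
    chats: dict = field(default_factory=dict)        # chat id -> member keys
    ghost_keys: dict = field(default_factory=dict)   # silently injected keys

    def keys_to_encrypt_to(self, chat_id: str) -> list:
        # what the sender's client is told to encrypt to
        return self.chats[chat_id] + self.ghost_keys.get(chat_id, [])

    def keys_shown_in_ui(self, chat_id: str) -> list:
        # the notification of the extra member is suppressed
        return self.chats[chat_id]

def send_group_message(provider: Provider, chat_id: str, plaintext: str) -> dict:
    # the encryption algorithm itself is untouched: the client simply
    # encrypts once per key the provider's server hands back
    return {k: toy_encrypt(k, plaintext) for k in provider.keys_to_encrypt_to(chat_id)}

provider = Provider(chats={"chat1": ["alice_key", "bob_key"]})
provider.ghost_keys["chat1"] = ["ghost_agency_key"]   # the "ghost user"

ciphertexts = send_group_message(provider, "chat1", "meet at noon")
print(provider.keys_shown_in_ui("chat1"))   # ['alice_key', 'bob_key']
print(sorted(ciphertexts))                  # includes 'ghost_agency_key'
```

the point of the sketch is that the cryptography itself is never weakened; what is exploited is the user's trust that the provider's participant list matches what the app displays.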
1:51 pm
there are several other threats posed by the new australian law's approach to encryption. in our coalition comments, in addition to explaining the breadth of the new powers created by the bill, we also addressed three other key concerns. first, the law lacks any requirement for prior independent review or adequate oversight. many features of australia's new law, such as the authorization for technical capability notices, were modeled on the uk's investigatory powers act. that law also raises threats to digital security and human rights, but section 254 of the uk act does require that judicial commissioners review and approve proposed technical capability notices before they may be issued. although we still have questions about the adequacy and independence of that review under the uk law, australia's tcn authority poses even greater threats to cybersecurity and individual rights, because there's no provision requiring any type of prior, let alone independent, review. in addition, australia has no bill of rights, so while there are procedures through which tech companies may challenge government requests and orders, these challenges will be more difficult. tech companies will not have the same legal arguments available to them based on protecting
1:52 pm
individual rights as they would in countries like the uk and the u.s. second, the law requires undue secrecy. although the law requires statistical transparency reporting by the government and permits statistical transparency reporting by tech companies, it also includes very strict nondisclosure requirements. violation is a criminal offense punishable by up to five years in prison. and there are no limits on the duration of these gag orders, such as we have here when the reason for confidentiality no longer exists. third, the law's definition of covered designated communications providers is overbroad. it includes anyone who provides an electronic service that has one or more end users in australia. so this means that any tech company doing business in australia or anyone providing electronic services in australia
1:53 pm
is subject to government demands that they weaken the security features of their products and services. so this is bad for australia, but what does it mean for us here in the united states? well, australia's legislation appears to be part of a coordinated effort by the five eyes alliance. for those of you not familiar with that term, the five eyes is an intelligence alliance comprised of australia, canada, new zealand, the united kingdom and the united states that dates back to world war ii. since 2013, these five nations have also formed a five country ministerial, which is an annual convening on strategy and information sharing on law enforcement and national security issues. for the past two years, the five nations have focused on strategies and policies to weaken encryption. just this past august, august 2018, the five countries released a statement of principles on access to evidence and encryption, and
1:54 pm
that statement includes that if these governments continue to quote, encounter impediments in their efforts to access encrypted communications, they may pursue legislative mandates for encryption back doors. the very same month that that statement came out, australia released the exposure draft of its encryption bill. so now australia's law can provide the united states and other governments with a back door to an encryption back door. australia now has the authority to compel providers to create encryption back doors and once providers are forced to build weaknesses into their products, other governments can exploit those weaknesses. i already mentioned the example of apple versus fbi. now if australia issued a technical capability notice to compel apple to build a new operating system to circumvent iphone security features which is what the fbi demanded in the san bernardino case, then if
1:55 pm
apple complied and built that system, it could no longer argue that it lacked the capacity to turn over data to the u.s. government in similar cases. similarly, if australia forced facebook to re-engineer encrypted chats to be accessible in response to australian legal demands, those chats would also be vulnerable to other governments' demands. finally, there is of course also a risk that the u.s. government will seek to expand its own direct authority by pointing to australia as the new model for, quote, responsible encryption legislation. so whether it's as a pathway or a model, the australian law creates risks to cybersecurity and privacy that extend well beyond australia's borders. thank you. [ applause ] >> thanks so much, sharon. next up: the french philosopher michel foucault is
1:56 pm
known for his analysis of the tight link between surveillance and training, or discipline. his book is loosely translated in english as "discipline and punish," though it could equally be translated "to surveil and to punish." very naturally, close monitoring is always a key part of training and indoctrination, so it is no surprise that children are often closely monitored as we are teaching them. that is perhaps an inevitable part of raising children safely, but it also means we need to worry about whether we are training them for compliance with surveillance. as the technological capability to monitor children ever more
1:57 pm
closely becomes both a reality and widespread in use, i often wonder whether we are preparing children to accept as normal a world in which everything they do is closely scrutinized. to look at one aspect of that, the social media surveillance of students, i want to invite rachel levinson-waldman of the brennan center. >> great. thank you so much. that's really the perfect introduction. i will be coming back to exactly that point near the end of my presentation. i'm rachel levinson-waldman from the brennan center for justice. i am going to be talking about social media surveillance of students, and especially k through 12 students. to start this off, i want to talk just for a moment about the prevalence, the deep saturation, of kids' presence online at this point.
1:58 pm
according to a pew internet study from last month, 97%, 97% of 13 to 17-year-olds in the u.s. have -- are on at least one major online social media platform. 95% of american teens have access to a smartphone and 45% say they are online almost constantly. so there is clearly a lot of content out there and a lot of time that teens and even younger kids are spending online. with that social media presence comes social media monitoring, and these tools are sold for a variety of purposes. they are sold as preventing bullying, preventing school shootings, potential suicide and other online threats and maybe not surprisingly, they are also big business. spending by public schools nationwide on nine major social
1:59 pm
media monitoring companies -- you can see here that there are some spikes, some mountains and valleys, but overall it's a pretty massive increase in spending, starting in 2010 and going up: there are spikes in 2015, again in '16 and '17, and this big spike in the summer of 2018, potentially driven by the shooting in parkland, florida. public school districts are spending more and more money on automated social media monitoring tools. this is similarly reflected by keyword searches -- searches for social media monitoring in contracts between public schools and private companies -- again showing these spikes over the last several years, really significant increases and then a major spike in 2018. so, increasingly, a lot of public money is being spent on these services. now, based on these statistics, you might think that schools are getting more dangerous but in
2:00 pm
fact, the opposite is true. schools are actually getting safer. while it's true that this country has a unique risk of school shootings among developed countries, and while obviously a single shooting or even a single serious bullying incident is one too many, the overall crime decline in this country holds true in schools as well. ... for context, the odds of choking are one in 3,400. in 1995, 10% of students age 12 through 18 reported being a victim of a crime at school in the previous six months. by the 2015-2016 school year, just 3% of students did. in that 20-year period, it went from 10% down to 3% -- a major decrease. in general, over the last two decades, only a small share of
2:01 pm
youth homicides and youth suicides have occurred at school. of course, part of the hope with social media monitoring is that maybe it will pick up risks off of school grounds, but by any measure school is a pretty safe place to be. the one state in the country that has legislated social media monitoring is florida. i'm sure everyone is familiar with the shooting in parkland, florida last february. nikolas cruz, a former student, shot and killed 17 students and staff members and injured 17 others. in the wake of that shooting, the florida legislature passed a law that included the creation of an office of safe schools within the state department of education. that office is required to coordinate with the florida department of law enforcement to set up a centralized database to facilitate access to a pretty wide range of information, including social media data. the legislation also established a
2:02 pm
public safety commission, which recently recommended that the department develop protocols around social media monitoring. it doesn't look like that has begun quite yet, but it is likely to do so in the new year. as it turned out, nikolas cruz had posted online about his intentions before the shooting. people had taken notice: he was reported to the fbi and local police three times for disturbing behavior. one call to the fbi warned that he might become a school shooter, and a separate call flagged a youtube post in which the user had said he wanted to become a professional school shooter, although the poster was not identified as cruz until after the shooting. so while there certainly were warning signs on social media, it wasn't a case of the district flying blind. people were seeing those warning signals and trying to act on them. what really failed the students was not a failure to see those posts but, according to a review
2:03 pm
of the school district's actions that came out in august, it was more that the district itself had failed at nearly every turn to provide cruz with the educational and support services he needed. nevertheless, florida is working on a first-of-its-kind national experiment when it comes to social media data. there's a big question here, which is: why not? if a single school shooting is one too many, and social media monitoring could catch one future nikolas cruz, or could catch one future suicidal student, why not do it? if the stakes are that high, what is the harm? well, there are a lot of reasons to be very cautious about this monitoring. the first is a real concern about the accuracy of social media monitoring tools. this plays out in a couple of different ways. one way these tools can be inaccurate is through overreach --
2:04 pm
the fact that they are likely to pull in much more information than will be useful. police in jacksonville, florida set up a social media monitoring tool to search for keywords that were going to be related to public safety or that might indicate some risk of criminal activity. one of the words they set up was the word bomb, thinking that if there was a bomb threat, the tool would turn it up. it turned out there were no bomb threats flagged online; instead, the tool was inundated with posts about things like pizza that was the bomb and photo bombs. a lot of stuff coming in, and very little of use.
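as a toy illustration -- the posts and the keyword list below are made up, and the matching logic is just a naive sketch of how simple keyword filters behave, not any vendor's actual product -- you can see how this kind of flagging buries the one relevant post in noise:

```python
# hypothetical posts and keyword list; naive word matching roughly mirrors
# how simple keyword-based monitoring filters behave
posts = [
    "that pizza last night was the bomb",
    "epic photo bomb at graduation lol",
    "bomb threat called in to city hall",      # the only post that matters
    "this chem final is going to bomb my gpa",
]
keywords = {"bomb"}

def flagged(post: str) -> bool:
    # strip basic punctuation and flag any post containing a watched word
    return any(word.strip(".,!?") in keywords for word in post.lower().split())

hits = [p for p in posts if flagged(p)]
print(f"{len(hits)} of {len(posts)} posts flagged")   # 4 of 4 -- almost all noise
```

every post containing the word gets flagged, so the signal-to-noise problem is built into the approach itself.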
2:05 pm
the second issue is underreach, by which i mean that the kinds of risks social media monitoring tools would like to find often are not going to appear online at all. i mentioned that nikolas cruz had posted online about his intentions and people had reported that, so it's not clear what the extra value of monitoring software would have been. and as it turns out, to some extent he was the exception. the brennan center did an informal survey of major school shootings -- and unfortunately that is a category -- major school shootings since the sandy hook shooting in 2012. there was only one other perpetrator, according to public reporting, who had put up social media postings that strongly indicated an interest in school violence. that was adam lanza, the shooter at sandy hook. he posted in discussion forums about the columbine high school shooting and he operated tumblr accounts named after school shooters. those postings weren't secret, and while they may not have been noticed at the time, it's hard to imagine today that they would not be reported to authorities.
2:06 pm
in general, the online profiles of major school shooters, which again get a lot of review after the fact, don't show anything that would flag them for an automated tool. for instance, the perpetrator of a 2014 shooting in troutdale, oregon had a facebook page showing he liked first-person shooter, military-themed games like call of duty, and he liked various knife and gun pages. in retrospect, sure, those seem like warning signs that something was going on, but in fact the official facebook page for call of duty: world war ii has nearly 24 million followers and the remington arms facebook page has over 1.3 million likes, so sending up a red flag for everyone who enjoys these pastimes would create a
2:07 pm
huge quantity of noise for very little signal. whether or not they are automated, social media monitoring tools have built-in shortcomings. there's a great report on this called mixed messages, which looks at a lot of the research, and as that research shows, monitoring tools generally work best when the post is in english and when the tool is looking for something concrete. they can be fooled by lingo and pop-culture references, and the best example comes from the 2015 trial of the boston marathon bomber. during the trial, the prosecution introduced as evidence several quotes from his twitter account to try to show something about his state of mind. he had tweeted -- and this is one of the things the prosecution brought up -- a quote that said i shall die young, which might seem to suggest something about intent, but it was a quote from a russian pop song, and
2:08 pm
the tweet linked to the pop song. the agent had just not clicked on the link in the tweet to see that it was a song lyric. there were other quotes from jay-z songs and south park episodes, among other things. social media is incredibly contextual, and neither automated tools nor, often, human analysts can parse out that context. the second major concern is the risk of discrimination, and this comes in two forms. the first is that the keywords themselves that the tools are set to flag on will be discriminatory. in one documented case, when police set up a social media monitoring tool, the hashtags they flagged included black lives matter, ferguson, muslim lives matter and the arabic word for community. needless to say, those were hardly signs of a public safety
2:09 pm
threat. these tools are only as good as the people who are using them, and there are a lot of ways to use them to further a discriminatory mindset. the second form is the risk of discriminatory impact: whatever keywords are flagged, there will be a huge amount of discretion in what is done with the results, including who is punished by the school. we already know that students of color at every level of schooling experience harsher discipline than white students, even for the same infraction and even when they commit infractions at lower rates than white students. so there's a real concern that social media monitoring could contribute to the school-to-prison pipeline. i suspect many of you remember the muslim teenager who brought a homemade clock to his dallas-area
2:10 pm
high school and was then arrested on suspicion of concealing a bomb. he was well known at school for bringing in electronics, tinkering, and fixing other people's electronics, and he told his teachers and the principal repeatedly that it was, in fact, a clock. that raises suspicions that the scrutiny he was put under, and his ultimate arrest, were essentially grounded in islamophobia. on the social media front, an alabama high school district paid a former fbi agent to go through student social media accounts on the basis of anonymous tips. the district ultimately expelled over a dozen students on the basis of what he found online. 86% of the students expelled were black, even though black students made up 40% of the student body. and, not surprisingly, students have also been mistakenly identified as posing a threat because of their posts. one connecticut teenager posted a
2:11 pm
picture of a toy airsoft gun that was modeled on a real rifle. when he put it up, he thought it was awesome and that his friends would think it was awesome. another student who saw the post was worried about it, so he reported it to school officials. this does not necessarily strike me as a crazy thing to do, although, as zach noted, if officials had googled the manufacturer's name on the side of the gun, they would've seen it was a toy gun, even though it did bear a resemblance to a real one. instead of discussing it and resolving the issue, potentially with lessons about responsible social media use and thinking before you post, he was not only suspended for the day but arrested for breach of peace, a misdemeanor offense. because it's so hard to reliably pinpoint individual social media posts that actually indicate some kind of live threat,
2:12 pm
monitoring companies have a perverse incentive -- an incentive, to some extent, to sweep up everything so they can assure their customers that they can spot that needle in the haystack. at the same time, there is very little reliable way of gauging their effectiveness. a 2015 investigation by the christian science monitor revealed that none of the three major school social media monitoring companies it examined had firm metrics for measuring effectiveness. at least one said, basically, we know that we succeeded when we get a call from the school saying something we sent them was interesting. it's a perfect storm for a mindset of more, more, more. at the same time, parents and students often know very little about these tools. research shows that while social media monitoring companies may assume that students consent to being tracked by virtue of posting on public sites, students more often
2:13 pm
believe companies are prohibited from sharing personal information with third parties. so there's a real lack of information about how these programs operate -- or rather, there is asymmetrical information. finally, and this goes to julian's point at the beginning, it's worth thinking about what it means for students to be under constant surveillance online. as a practical matter, they may stop posting, start posting less, or move to other forums, which will simply blunt any effectiveness these tools would have had. maybe more concerning to me, it teaches students to expect surveillance, and even to anticipate an authority figure's opinion and react accordingly. some of this, you could say, is good digital hygiene: when we post something publicly, we need to think before we post about what that looks like and who might see it now or in the future. but it's not clear that it is healthy for students who are learning about citizens' roles in
2:14 pm
a democracy to know they are under that surveillance all the time and to be acting accordingly. so what does this mean? at the very least, before a school or school district rolls out a social media monitoring program, it's incumbent on officials to weigh the costs and benefits and to involve parents and students in a frank discussion of what it means. and if they decide not to set forth on a monitoring program, they should remember that they are most likely not going dark: there are a lot of concerned people out there who will spot posts and flag them. thank you so much. [applause] >> thank you, rachel. it does remind me of an acquaintance, a science fiction writer, who has a more optimistic view of this: it's great, because we are training our children to develop
2:15 pm
habits of fairly sophisticated counterintelligence tradecraft just to be able to have a normal childhood. the next generation will be sophisticated about defeating surveillance, and i suppose we will find out. our next two topics focus on privacy in public, in a sense: the myriad ways that, just walking down an ordinary city street, we are being observed in ways we may not recognize, and the ways existing networks of surveillance can be transformed in fairly deep ways by existing infrastructure becoming a platform for new methods of monitoring. the first of these will be an examination of camera networks for facial recognition surveillance from jake laperruque of the project on government oversight.
2:16 pm
>> thank you for having me here. i'm a senior counsel at the project on government oversight, where i focus on surveillance issues, and i want to talk about facial recognition: how cameras and camera networks in various respects can empower and grow facial recognition surveillance into a dragnet. just to start quickly: facial recognition surveillance is no longer a sci-fi technology out of minority report that we will see in the distant future. it is happening now. the fbi conducts over 4,000 facial recognition searches every month on average, in coordination with local and state law enforcement.
2:17 pm
customs and border protection has a biometric exit program that uses facial recognition for outgoing flights and plans to spread this to airports in general, as well as to seaports and land ports across the country. i.c.e. is looking to buy facial recognition technology as well. so facial recognition is a very live and real surveillance issue. facial recognition depends on three key factors to be a powerful force for surveillance. first, you need a database of photos that are identified to particular people, and government databases already include half of all american adults. second, you need very powerful software that can scan across hundreds of millions of photos and scan faces rapidly, and plenty of companies are developing that technology. third, and what i want to focus on, you need a network of cameras that you can tap into and use to see people's faces everywhere, all the time.
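as a rough, hypothetical sketch of how those three ingredients fit together -- the watchlist, the embedding vectors and the camera feed below are invented for illustration, and real systems use trained face-recognition models rather than hand-made vectors -- the core of such a dragnet is just a similarity search over faces pulled from video:

```python
# hypothetical sketch: watchlist maps names to face "embeddings" produced
# offline by a recognition model (factor 1); cosine_similarity stands in for
# the matching software (factor 2); camera_feed represents faces extracted
# from a networked camera (factor 3). the vectors are made up.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def scan_feed(camera_feed, watchlist, threshold=0.9):
    # compare every face seen by the camera against every watchlist entry
    for frame_id, face_vec in camera_feed:
        for name, ref_vec in watchlist.items():
            if cosine_similarity(face_vec, ref_vec) >= threshold:
                yield frame_id, name          # a "person of interest" alert

watchlist = {"person_of_interest": [0.9, 0.1, 0.4]}
camera_feed = [("frame_001", [0.2, 0.8, 0.1]),
               ("frame_002", [0.88, 0.12, 0.41])]

for frame_id, name in scan_feed(camera_feed, watchlist):
    print(f"alert: {name} possibly seen in {frame_id}")
```

scaled up to tens of thousands of networked cameras, that same loop turns a camera network into the kind of automated tracking system described next.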
2:18 pm
there are four areas where you have the potential to build these camera networks: first, government surveillance cameras, or cctv; second, police body cameras; third, privately owned security cameras; and last, social media databases. let's start first with government surveillance camera programs, cctv programs. a decade ago, chicago mayor daley said he expected one day we would have a police camera basically on every corner. i want you to keep that quote in mind as we talk more and more about cctv, but first let's go to where we truly have a video dragnet and where big brother status has been achieved. that is in china. china has by far the most powerful network of government surveillance cameras that we can see, with an estimated 200 million cameras around the country,
2:19 pm
and the effects are profound. if you look at cities, the networks are incredibly dense and powerful. beijing maintains 46,000 cameras that blanket the city, and state media and the beijing police boast that this network allows them to have 100% coverage of the city and see everything that is going on all the time. this can have a powerful impact when paired with facial recognition. recently, a reporter asked to test the system: he went to a chinese city, gave his photo to the government to put into the system, and asked them to find him. using the cameras and their automated facial recognition systems, they tracked him down in a city of millions of people in a mere seven minutes. that is china, but cctv in america is growing to a striking degree,
2:20 pm
particularly in new york and chicago and washington. in new york, there is a domain awareness system, and camera feeds there can be subject to real-time viewing, analysis and other tools; facial recognition could become one of those tools in the future. oakland considered building its own domain awareness center for use by government, involving everything from the port authority to police cars to cameras outside schools. smaller cities such as st. louis and new orleans have set up video watching hubs as well, and the city with by far the largest such network in the united states is chicago. chicago is closest to achieving big brother status in america. it maintains a surveillance network of over
2:21 pm
30,000 total cameras in the city. this in some ways surpasses the level of surveillance dragnet you'll see in china: although 30,000 cameras in chicago is less than the 46,000 total in beijing, if you look at area density, that is on the order of 128 cameras per square mile. and this can have a powerful effect when combined with facial recognition, which we are starting to see in america. currently, the way the system works is you have scanners across the city, and they will scan faces and flag any persons of interest, whatever persons of interest means. that is government cctv.
2:22 pm
next, i will get to police body cameras, another area of risk in terms of establishing a video surveillance dragnet in the united states. the simple reason for that is that body cameras are becoming increasingly popular in america and in american police departments. [inaudible] this is not a huge surprise, because axon offers their body cameras for free so long as you then use their storage system. studies of police departments in recent years indicate that 97% of the largest police departments either have body camera programs in place, are in pilot or testing stages, or, if they don't have them yet, plan to field them in the future. it is going to be a universal phenomenon of police wearing body cams on our streets as
2:23 pm
cops walk by. why is this a big deal for the proliferation of surveillance cameras? because cities have police throughout them. on average, localities have between 16 and 24 police officers for every 10,000 residents, but when you look at big cities, this number gets much higher; plenty of cities have as many as or more than 50. if you look at area density, you can see some cities are blanketed with police officers. for example, ten different cities have over 20 police officers for every square mile, and topping the list is new york city, which has over 100 police officers for every square mile. in terms of facial recognition, axon recently backtracked on putting facial recognition in their body cameras, saying the technology is flawed, and [inaudible] they scrapped plans that
2:24 pm
might have happened as soon as this year to put facial recognition in the system. but not all vendors are taking this cautious approach, and it's only a matter of time before companies like axon are satisfied that the technology is good enough for their purposes and begin to institute it. an axon vp described their interest in body cameras a couple of years ago by saying that by putting facial recognition in body cameras, every cop in america will be robocop. this is worrying because while virtually all police departments are charging ahead with police body cameras, very few are setting rules and standards for facial recognition. according to a scorecard on body cameras maintained by upturn and the leadership conference, basically no cities that operate body camera programs have effective rules on facial recognition, and that means many cities are not acting with
2:25 pm
appropriate standards. that is police body cameras. next i want to talk about private surveillance cameras and government surveillance networks. [inaudible] this is another way government can potentially build out video surveillance networks, but do so with very little work on the infrastructure and at a fraction of the cost. we may not have the 200 million surveillance cameras that china does, but america does have over 30 million privately owned security cameras throughout the country. given the potential to simply tap into these instead of building your own cameras, it is no surprise that governments want to turn these into surveillance networks. one example of those cameras is amazon's ring doorbell, the video doorbell system. just as a side note, news broke that amazon had patented technology
2:26 pm
to build facial recognition into those doorbells, connect them to police networks, and notify police when anyone suspicious comes up -- another fun innovation from amazon. police departments are not just going along with this idea but proactively soliciting arrangements whereby those cameras can be accessed and readily used by law enforcement in video surveillance networks. i mentioned new york before and the domain awareness system they have there that allows real-time streaming of video cameras; of the 6,000 cameras connected to new york's network, two-thirds are privately owned cameras whose owners have agreements that allow the new york police department to access and use them. washington, d.c. and a lot of other cities offer incentives to try to get people to hook up their security cameras, and here, for example, you see the flyer basically saying: please
2:27 pm
purchase security cameras and connect them to our networks, and we will pay you to do this. that is private security cameras. it's similar to government cctv: a network of stable cameras that could eventually provide a dragnet that could [inaudible]. this is simply another way to build out that capability, and it is a pretty severe risk given that we don't have the option of stopping government from building these cameras -- the cameras are already there, and the worry is about law enforcement potentially tapping into them. now let's talk about social media photos. this is a different vein, in that we're not talking about cameras taking images but about images that are being stockpiled. social media photos are potentially the greatest risk in terms of a photo dragnet that could be used or crafted by government for facial
2:28 pm
recognition, because of the sheer size. we've already seen facial recognition used this way to a limited degree by one social media monitoring firm -- they got caught and admitted that they had run social media photos through facial recognition technology during the protests in baltimore to find individuals with outstanding warrants and basically arrest them and remove them from the crowd. luckily, when this came out as a product of aclu research, the social media companies responded properly and blocked and shut down that firm's access to their services, and it's important that social media companies continue to be vigilant and to limit their apis to prevent photo databases from becoming a means of facial recognition surveillance. it's also important that the companies start to think not just about harvesting done openly on their platforms through api access, but about court orders and those means. we've seen things like this in
2:29 pm
the recent past. a couple of years ago, yahoo received and complied with a court order asking that they scan all email content in their system for specific content the government was looking for, and it's not hard to imagine the government coming to someone that maintains photo databases and asking them to find a particular face. [inaudible] these companies maintain very large photo databases: google has over 200 million users in its cloud photo service, including 24 billion selfies, and facebook has over 250 million photos uploaded every single day. it would be great if these companies, as they continue to build out what are already fantastic transparency reports, which are getting better all the time, would think about including a category for facial recognition, so that if the
2:30 pm
government does come with a broad, excessive demand -- we want to start scanning all your photos for facial recognition purposes -- we will get a heads-up and can start acting. with that, i want to conclude by talking about what actions we can take if we start to see these activities and how we should respond. ... this is an effort to improve transparency and limit surveillance properly in cities all across the country. i'm sure as that campaign goes on, it will continue doing a lot of great work to limit surveillance and to limit
2:31 pm
advanced surveillance tools like facial recognition being built into cameras. on the federal level, there is a lot of potential in terms of limiting and conditioning funding. a lot of the funds for local government cctv networks don't come from those localities; they come from the federal government. doj funds cctv. orlando, which is running real-time facial recognition on its cctv network, originally received funds for cctv from the department of justice. it would be great if, in the future, when doj handed out funds for cctv surveillance networks, they said you cannot use this for facial recognition, or set strict guidelines and limits on how it could be used. dhs funds surveillance cameras for cities on a large scale as well. again, this is another opportunity: setting strict rules, guidelines and limits would be a very effective way of stopping these surveillance networks from being
2:32 pm
turned into mass facial recognition, location tracking and scanning networks. finally, the department of justice also issues grants of tens of millions of dollars every year for police body cameras, but again, we see virtually no departments putting in good rules for facial recognition on body cameras. it would be a vast improvement if, when doj was handing out its grants, they said you need to put in effective rules, guidelines and limits to protect privacy before we give you all this money. so those are some actions we should take, and i think it is very important we take them now, because we are very quickly approaching the point where we are all, on a daily basis, going to be much like that bbc reporter: able to be tracked down through an automated computer system by a network that is watching with a million little eyes. thank you very much. you can read more about our work at pogo.org, and i'm looking forward to the rest of the conference. [applause]
2:33 pm
>> the classic feature of surveillance that makes it a mechanism of power is its asymmetry. in jeremy bentham's panopticon, the prisoners in the ultimate surveillance prison knew that they were under potential observation: they could be seen but could not see their watchers. when it comes to public networks of cameras monitoring us, one of the most effective things we can do to encourage people to react to the changes happening around them is to make them aware of those changes. we're about to hear about a tool the electronic frontier foundation has developed to help people recognize the ways in which surveillance in public is exploding around us. to talk about that, i want to invite dave maass.
2:34 pm
>> thank you for having me today. my name is dave maass, and i'm with the electronic frontier foundation, based in san francisco. we've been around since 1990, and we work to make sure rights and liberties continue to exist as our society's use of technology advances. i particularly work on eff's street-level surveillance project, which aims to ensure there is transparency, regulation and public awareness of the various technologies that law enforcement is deploying in our communities. a lot of times that work looks like filing public records requests. so, for example, with license plate readers, eff teamed up with another organization to file hundreds of public records requests around the country to find out how law enforcement agencies were sharing license plate reader data amongst
2:35 pm
themselves. or take drones: we filed a public records request for mission logs and reports on how uc berkeley police used drones to surveil protesters in 2017. or we filed a public records request with the san francisco district attorney's office to get a spreadsheet with the geolocations of every surveillance camera in their database, similar to what jake was just talking about. the problem is that too often our work ends up looking like this: we are sharing public records and saying, here are the documents on documentcloud, or here's a white paper we wrote, or a 3,000-word blog post. or, even worse, it's me standing in front of you doing a powerpoint presentation, and if we're lucky i have a funny cartoon to go with it. i don't have one today, so i had to get this one. really, our work should look like this to the public: contextualized within their community. if i could, i would run a
2:36 pm
walking tour company where i could take people around and show them the various surveillance technologies around them. but i'm a very busy person, and i don't know that doing tours for groups of six or seven people is the most effective way to get our message across. however, maybe this concept can transfer over to something like virtual reality. taking a step back, at the intersection of virtual reality and local law enforcement technology, police are already working on virtual reality stuff. this is a company out of georgia called motion reality that has a warehouse-sized space where police officers put on virtual reality helmets, they are given real-feeling but fake electronic firearms, and they are wired up head to toe, and they go in to run scenarios that can be replayed back so they can see what they did right and what they did wrong. one of my favorite things about
2:37 pm
this is that they are also covered in, i guess, electrodes, so if they are shot, they get shocked and immobilized in that part of the body. there's a company that has taken one of these headsets and modified it to work as a replacement for field sobriety tests, so the whole follow-the-flashlight thing would happen within a vr visor. there's also a surveillance aspect. this is something called bounce imaging: a little ball covered with cameras that a swat team officer might chuck into a hostage situation or whatever, and then somebody can sit outside in virtual reality, looking around before they go in, while it records a 360 of everything going on. so what can we do on the other side with vr? let me give you a quick background, a little brief history of our organization and vr. this is one of our founders, john perry barlow, both a lyricist for the grateful dead as well as a digital
2:38 pm
pioneer. in 1990 he wrote an essay after he had gone and visited some of the early vr companies. he thought it was a psychedelic experience -- he thought a lot of things were psychedelic experiences back then, because i think he was on psychedelics quite a big chunk of the time. this was the next big thing: welcome to virtual reality, we have leapt through the looking glass. 25 years on, a lot has happened since then, and in 2015 we finally saw vr start to move towards a mass commercial market. this was the oculus, the vive, the playstation vr; they all came out by early 2016. for our organization, there were two big questions we were looking at. first of all, what are the digital rights implications of virtual reality technology in our society? and second, what is the potential for virtual reality as an advocacy tool and an educational tool? let me start with what i think of as the privacy element. the intercept did a great piece in 2016
2:39 pm
hypothesizing that virtual reality might be the most invasive kind of digital surveillance with regard to the internet yet. i tend to agree. this voiced a lot of the concerns i was having and that we were talking about amongst ourselves but had not seen floated publicly yet. the reason is biometrics. virtual reality tends to rely on our physical characteristics in order to function. on a very basic level, that is how your head is moving, the distance between your hand and your head, how long your arms are, whether you're left- or right-handed. even something as simple as how your head is moving in a virtual reality environment can be correlated to mental health conditions. more advanced vr technology is starting to involve devices that measure your breath or map out your facial expressions, and that's a whole other world of biometrics. one of the creepiest things is when you have companies that, in order to gather reaction
2:40 pm
biometrics, are throwing stimuli at you in a fairly quiet manner without saying why, so they can find something measurable in how you respond. we will not get too much into augmented reality, but that will also present even more problems, because a lot of devices are scanning the world around you in order to produce content. something interesting as well: there was a research study by the extended mind that found that, in the current state of play, 90% of vr users are taking steps to protect their privacy, whether that is adjusting their facebook settings or using an ad blocker. while three-quarters of users were okay with companies using their biometric data for product development, the overwhelming majority was very much opposed to that biometric information being sold, anonymized or not, to other entities.
2:41 pm
as far as vr as an advocacy tool, we're not the first ones to try this. planned parenthood has an experience called across the line that puts people in the position of a woman trying to seek reproductive health services at a clinic that has a lot of angry protesters outside. peta challenges people to step inside a factory farming situation: what is it like to be an animal at a factory farm, or a chicken? then there are some groups out of brookline, massachusetts, that worked with the united nations environmental assembly to do virtual-reality visualizations of data on air pollution. they took that and ran it through a bunch of u.n. delegates in nairobi. that brings us to eff's spot the surveillance project. this is, at its base, a virtual
2:42 pm
reality experience that uses a basic simulation to teach people about the various spying technologies that police may deploy in their communities. when we were pursuing this in the early stages, we had some considerations. we wanted it to be a meaningful advocacy experience; we wanted to not collect biometric information ourselves; as an organization that supports open source, we wanted to build it with open-source tech tools; we wanted to make it work on multiple platforms and not just the oculus; and we wanted it to be built on a modest budget, because we are a nonprofit and we're not sony. when i say meaningful advocacy experience: we did not want to rely on the novelty factor of vr. you can basically take anything and put it in vr, and if it is somebody's first time using the viewer, it will be, this is amazing, regardless of what it is. we wanted to make sure ours was presenting research in a way that only vr could allow. we didn't want people to just be watching a movie in vr. we want them to be doing something, interacting with the
2:43 pm
world and to be challenged by it. we wanted people to learn information in the virtual world that they would carry back to the real world. so the concept is: somebody puts the headset on -- and people can do demos during the lunch break -- and you are placed in a street scene in the western addition neighborhood of san francisco. there is a police encounter going on between a young citizen and two officers. you look around, and as you find something, you get a pop-up and a voiceover explaining what it is. it's not about how quickly you can go through it or scoring points about the surveillance technology; it is supposed to be an educational tool. there are four goals. one is: can we do a virtual reality experience cheaply in the first place, so that we could do other things down the road? number two was simply to educate people about the forms of surveillance. then we also wanted to help them figure out where these technologies are in
2:44 pm
their communities. finally, we had this thought: police encounters are very stressful situations, and protests are very stressful situations sometimes. things move very quickly, but it can be useful for people to take note of what surveillance technology they saw in those scenes. so perhaps, by putting people in a simulation in a controlled environment where they're able to gain practice looking for these technologies, it might carry over to those higher-stress situations. so we decided not to go with a computer-generated environment, and just go with a 360-degree photo. this is the ricoh theta v; it's also on the screen. it's got two lenses, one on each side, and it captures just beyond 180 degrees on each side and stitches them together, so you're able to capture everything. if i used it right now, you would get all of this. the only thing you might not get is the very base of the tripod
2:45 pm
underneath the camera. this helped us get past what people refer to as the uncanny valley when it comes to video games: the more you try to create a realistic person or environment, the more creepy it is to people. by using an actual photo of a real scene, with a few things photoshopped in, it bypasses that altogether. this is what the photo we took looks like; obviously, when you put on the virtual reality headset, it wraps all the way around you. you can see there is a scenario going on there, and you can kind of see us at the bottom. i will show you what that looked like -- you don't see this in the game. the longer version of the full story is that we were hiding there outside this police station hoping police would come outside. eventually they did, and it being san francisco, they didn't question two people with a weird piece of technology on the street. [laughing] which was great, because it was
2:46 pm
kind of the perfect shot for us. for those of you who will not have a chance to try it, this is what it looks like: if you look over at the body cam, you get a pop-up about it that explains what it is, with a voiceover. vr is such a visual medium, but we did not want it to be the case that you had to be fully sighted to enjoy this experience or to learn from it. if you are only able to see out of one eye, or you have limited vision but a certain amount of awareness of your environment, you can go in and still learn things through the audio. we did a beta launch in november. this is at the internet archive at the aaron swartz day hackathon; that's brewster kahle, the founder of the internet archive, testing it. we're looking at having tables like this. at this point, not a lot of people have these devices in their homes, even though this one dropped down to $200 recently.
2:47 pm
not a lot of people have it, but it is something we can take to conferences; we can have our grassroots activists, when they're going to visit groups, bring one of these with them just like they would bring one-pagers or brochures. we've run probably 500 people through it in the last month, which, if you think about it in terms of an activism organization -- if you can spend nine minutes with somebody getting them to focus exclusively on surveillance, that is incredible. that's a lot of time. it is also available on the internet, and one of the things i found gratifying is that portland, maine -- which is about as far from san francisco as you can get while staying in the united states -- has hacker spaces and maker labs that are trying to set it up and have people demo it. we have started to see social media respond to it as well. these are my favorite tweets, and this one in the middle
2:48 pm
is exactly what we were going for with this, so i feel pretty good about that. as for what's next for us, we are still in beta mode, continually doing demos to gather user feedback. we've improved the experience in different ways; working with open-source technology, sometimes there might be a tweak in the language and everything breaks, so we've had bugs come up that we have to fix, and we need to get everything stable for the april 2019 launch. once we have that, we will start sending it out to communities, and maybe come up with an educational curriculum so teachers can use it, but after that we want to look at what the next version of this project should be. we have a few ideas. one of them is, let's do an indoor version: a home office where you see the internet router, a printer, all these devices which might be surveilling you. or, people want to know what it's like in their own area or what it's like in new york city, so
2:49 pm
maybe we build the same thing for various areas. or maybe we abandon vr altogether and go on to ar, and have a way for people's phones to project things into the world for them. all of this depends on how the technology develops, what interest we get in it, whether there's a return on investment, and what kind of engagement there is. it's a new world and we don't know what it is going to be; we don't know where it will be in five years. but i can let you know we will be demoing it after lunch, just outside the lunchroom, where you can try it out. i promise we haven't hidden a working camera in it or anything like that. that is all i have. if you do have a headset at home, or you just want to play around with it in your computer browser, it is at eff.org/spot. [applause] >> i love this idea. the concept of the tetris effect, the idea that when people play games, especially games that involve repetitive pattern
2:50 pm
recognition, that behavior often spills over into their non-game lives. people who play a lot of tetris start seeing shapes around them and thinking about how they can fit together. it shows up in the assassin's creed games as the bleeding effect, where someone reliving a simulation of his ancestors' lives takes on their sort of superhuman murder-stealth abilities, and that is both unrealistic and undesirable. but it might be desirable to imagine a population trained on games that teach them about spotting surveillance technology in the world around them, a more useful version of the tetris effect. turning back to the question of encryption, as we heard from sharon bradford franklin earlier, law enforcement has for years now been complaining that the spread of encryption is causing them to go dark, making it more difficult to do electronic surveillance of
2:51 pm
communications. there's a fascinating report from the center for strategic and international studies that points out that a lot of the difficulties law enforcement is having with intercepting electronic communications really don't have much to do with the need for backdoors, and that there's a lot of low-hanging fruit being left on the table that we ought to examine before we talk about legislating breaches in the platforms and tools we rely on to keep us secure. to talk about that, i want to invite jennifer daskal to discuss a report which i believe you'll find on the table outside. >> thank you, julian, and thanks to cato for putting on this excellent conference. as julian said, the focus of my
2:52 pm
talk today is the range of challenges that law enforcement faces in accessing digital evidence, separate and apart from the encryption-related challenges. this talk stems from a report that i worked on with co-author will carter under the auspices of the center for strategic and international studies, or what many of us know as csis. the debates about encryption undoubtedly will continue, but it was, and is emphatically more so after working on this report, our view that while the debates about encryption have taken up so much of the limelight, there is a range of other challenges that law enforcement faces that need to be dealt with, that can be dealt with relatively easily, and that need to be dealt with now. these challenges will continue no matter what happens with respect to encryption, no matter if, in
2:53 pm
fact, there ever were a clear encryption mandate; there would still be these other ongoing challenges that need to be dealt with. as our title, low hanging fruit, indicates, this is a problem we think can be relatively easily solved. not completely; nothing in this space ever leads to a complete solution, and we make a mistake if we assume that we are seeking a complete solution or that we are ever going to totally eliminate the friction in the process. some of the friction is, in fact, healthy. but some of the friction is unnecessary, and actually collectively harmful to both security and to privacy, and minimizing that friction is not only a laudable goal but one that is eminently achievable. to that end, i'll just note that the report we worked on was endorsed by a number of individuals and also groups and entities. it was endorsed by former cia director john brennan, former fbi general
2:54 pm
counsel and former assistant attorney general for national security ken wainstein, and the former boston police commissioner, ed davis. it's been praised by a number of different groups and providers, and several providers have already introduced a number of reforms consistent with what we called for in this report. so now that i've given you the hard sell, i'm going to spend the remainder of my time talking about the substance: a little bit about the methodology that we used in doing this report, a little bit about our findings, and our ultimate recommendations. this report stems from about a year's worth of research, including a series of qualitative interviews with state, local, and federal law enforcement officials, prosecutors, representatives from a range of different tech companies, and members of the civil society community. it also involved a quantitative survey of state, local, and federal law enforcement officials. the survey results are notable.
2:55 pm
hopefully you can all read at least a little bit of this. according to the survey results, those surveyed reported difficulties accessing, analyzing, and utilizing digital evidence in over a third of their cases. we believe that's a problem that is only going to continue to grow, as digital information becomes more and more ubiquitous and digital evidence is needed in just about every criminal investigation. this chart shows the responses to the question, what is the biggest challenge that your department encounters in using digital evidence? accessing data from service providers was ranked as the key challenge amongst our respondents, separate and apart from questions about encryption. identifying which service provider has the data was reported as the number one challenge; 30% ranked it as their biggest problem. obtaining the data once it was
2:56 pm
identified was reported as the number two challenge; 25% of our respondents rated it as their biggest challenge. accessing data from a device came next, with 19% ranking it as the biggest challenge they face. then collectively, analyzing data from devices and analyzing data that has been disclosed by providers, which are two separate things, account for about 21% who said that was their biggest problem. this is important because these are problems that can be fixed, or at least largely reduced, without huge changes in the system, but with more resources and more dedicated, systematic thought about how to address them. so to the extent that law enforcement doesn't know where to go to get data of interest, that is a problem that can be solved with better information flows and better training. to the extent that law
2:57 pm
enforcement faces challenges in obtaining data, that is a bigger challenge, and we heard two very different stories from the law enforcement officials we talked to and from the provider community. the law enforcement officials talked about what they perceived as very long delays in getting information back from service providers, what they perceived as service providers dragging their feet, of service providers having insufficient resources to respond to their needs, of requests being slow-walked or turned down in what they perceived to be invalid circumstances. providers, on their side, told a very different story. they complained about what they saw as overbroad requests, about law enforcement asking for things that simply were not available, and about delays being the fault of law enforcement as they internally debated and decided whether or not to get nondisclosure orders that would prohibit the provider from
2:58 pm
telling their customers or subscribers that their data had been obtained, with providers holding off, at law enforcement's request, on turning over the data until they learned whether or not they had permission to tell the customer or the subscriber. the data, interestingly, kind of supports both sides of the story. this chart shows the requests that u.s. law enforcement issued to six key companies, facebook, microsoft, twitter, google, yahoo! and apple, over time. this is based on the companies' own transparency reporting; there is no other good source of this data. not surprisingly, you see from this chart a pretty dramatic increase in requests over a short period of time. it shows requests in six-month intervals, so in the six-month timeframe ending in december 2013, there were about
2:59 pm
400,000 requests to the six u.s.-based providers. by december 2017, meaning the prior six months, that number had almost doubled, or at least increased by a significant amount, to about 650,000, almost 700,000 requests in that six-month period. what's interesting about this chart is that the grant rates have hovered more or less at the same level, at about 80%. they have been consistent over time in terms of the percentage of requests or demands that providers comply with, but that also means that the number, the absolute number -- >> you can see the last few minutes of this program on our website, c-span.org. just type "surveillance" in the search bar. taking you next to the u.s. senate, which today is considering legislation authorizing military assistance to israel, establishing new sanctions on the syrian government, and encouraging state and local governments to oppose efforts to put economic sanctions on
3:00 pm
israel. senate democrats plan to vote against moving toward a vote on this bill, not necessarily because they oppose it, but because they say congress should first take up legislation to reopen the federal government. today is day 18 of the partial government shutdown. a vote is scheduled for 5:30 p.m. eastern to end debate and move toward a vote on that bill. now live to senate coverage here on c-span2. the chaplain: let us pray. merciful god, enthroned far above all other powers, we need you to exercise your might for our nation during this challenging season. as we wrestle with the stalemate
